Kullback proximal algorithms for maximum-likelihood estimation

Authors

  • Stéphane Chrétien
  • Alfred O. Hero
Abstract


Similar resources

Sequential noise compensation by a sequential Kullback proximal algorithm

We present a sequential noise compensation method based on the sequential Kullback proximal algorithm, which uses the Kullback-Leibler divergence as a regularization function for maximum likelihood estimation. The method is implemented as filters. In contrast to the sequential noise compensation method based on the sequential EM algorithm, the convergence rate of the method and estimation error...


Noise Compensation by a Sequential Kullback Proximal Algorithm

We present sequential parameter estimation in the framework of Hidden Markov Models. The sequential algorithm is a sequential Kullback proximal algorithm, which chooses the Kullback-Leibler divergence as a penalty function for maximum likelihood estimation. The scheme is implemented as filters. In contrast to algorithms based on the sequential EM algorithm, the algorithm has faster conver...


Kullback Proximal Algorithms for Maximum Likelihood Estimation

Accelerated algorithms for maximum likelihood image reconstruction are essential for emerging applications such as 3D tomography, dynamic tomographic imaging, and other high dimensional inverse problems. In this paper, we introduce and analyze a class of fast and stable sequential optimization methods for computing maximum likelihood estimates and study its convergence properties. These methods...
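As a toy illustration of the Kullback proximal idea (not taken from the paper itself, and using a simple fully observed model rather than the incomplete-data setting studied there), each step maximizes the log-likelihood minus a Kullback-Leibler penalty anchored at the current iterate. For i.i.d. Poisson observations this proximal step has a closed form; the function name and data below are our own:

```python
def kullback_prox_poisson(data, beta, lam0, iters=50):
    """Kullback proximal iteration for the Poisson rate, a toy sketch.

    Each step solves  argmax_lam  loglik(lam) - beta * KL(lam_k || lam),
    where KL(a || b) = a*log(a/b) + b - a for Poisson rates.  Setting the
    derivative to zero gives the closed-form update
        lam_{k+1} = (sum(x) + beta * lam_k) / (n + beta),
    whose fixed point is the MLE, the sample mean.
    """
    s, n = sum(data), len(data)
    lam = lam0
    for _ in range(iters):
        lam = (s + beta * lam) / (n + beta)
    return lam

counts = [3, 5, 4, 6, 2]  # hypothetical sample; the MLE is the mean, 4.0
print(kullback_prox_poisson(counts, beta=2.0, lam0=1.0))  # → 4.0
```

With a fixed penalty weight beta, the error contracts by a factor beta/(n + beta) per iteration, so the iterates converge linearly to the maximum likelihood estimate.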


Maximum Lq-Likelihood Estimation via the Expectation Maximization Algorithm: A Robust Estimation of Mixture Models

We introduce a maximum Lq-likelihood estimation (MLqE) of mixture models using our proposed expectation maximization (EM) algorithm, namely the EM algorithm with Lq-likelihood (EM-Lq). Properties of the MLqE obtained from the proposed EM-Lq are studied through simulated mixture model data. Compared with the maximum likelihood estimation (MLE) which is obtained from the EM algorithm, the MLqE pro...
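For reference, the Lq-likelihood replaces the logarithm in the log-likelihood with the Lq function L_q(u) = (u^(1-q) - 1) / (1 - q), which recovers log(u) as q approaches 1, so the MLqE reduces to the usual MLE in that limit. A minimal sketch (the function name is ours):

```python
import math

def lq(u, q):
    """Lq function used in Lq-likelihood: (u**(1-q) - 1) / (1-q).

    Tends to the natural log as q -> 1; values of q below 1 downweight
    low-density observations, which is the source of the robustness.
    """
    if q == 1.0:
        return math.log(u)
    return (u ** (1.0 - q) - 1.0) / (1.0 - q)

print(lq(2.0, 1.0))    # exactly log(2), about 0.6931
print(lq(2.0, 0.999))  # close to log(2)
```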


Asymptotic analysis of covariance parameter estimation for Gaussian processes in the misspecified case

In parametric estimation of the covariance function of Gaussian processes, it is often the case that the true covariance function does not belong to the parametric set used for estimation. This situation is called the misspecified case. In this case, it has been shown that, for irregular spatial sampling of observation points, Cross Validation can yield smaller prediction errors than Maximum Likeli...



Journal:
  • IEEE Trans. Information Theory

Volume 46, Issue 

Pages  -

Publication date: 2000